# Reading Comprehension Task
The following question-answering models are listed under this task:

| Model | Author | License | Tags | Downloads | Likes | Description |
| --- | --- | --- | --- | --- | --- | --- |
| My Awesome Qa Model | vnktrmnb | Apache-2.0 | Transformers | 14 | 0 | A question-answering model based on bert-base-multilingual-cased, fine-tuned on the SQuAD dataset. |
| Bert Finetuned Squad | dpkmnit | Apache-2.0 | Transformers | 13 | 0 | A QA model based on bert-base-cased, fine-tuned on the SQuAD dataset. |
| Distilbert Base Cased Distilled Squad Finetuned Squad | ms12345 | Apache-2.0 | Transformers | 14 | 0 | A fine-tuned version of distilbert-base-cased-distilled-squad for question-answering tasks. |
| Bert Large Uncased Squadv1.1 Sparse 80 1x4 Block Pruneofa | Intel | Apache-2.0 | Transformers, English | 15 | 1 | A pre-trained 80% 1x4-block-sparse Prune OFA BERT-Large model fine-tuned with knowledge distillation; performs strongly on the SQuAD v1.1 QA task. |
| Albert Bahasa Uncased Squad | Wikidepia | not listed | Transformers, Other | 40 | 0 | An IndoBenchmark IndoBERT-Lite model fine-tuned on a translated version of the SQuAD dataset for Indonesian question answering. |
| Distilbert Onnx | philschmid | Apache-2.0 | Transformers, English | 8,650 | 2 | A question-answering model based on DistilBERT-base-cased, fine-tuned on SQuAD v1.1 with knowledge distillation. |
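All of the entries above expose the standard extractive question-answering interface, so any of them can be tried with the Transformers `pipeline` API. The sketch below uses the public `distilbert-base-cased-distilled-squad` checkpoint as a stand-in, since the listing shows display names rather than exact Hub repository IDs; swap in the repository ID of whichever listed model you want to evaluate.

```python
# Minimal sketch: querying a SQuAD-style extractive QA model with the
# Transformers pipeline API. Replace the checkpoint with the Hub repository ID
# of one of the listed models (the listing shows display names, so the exact
# IDs are not given here).
from transformers import pipeline

qa = pipeline(
    "question-answering",
    model="distilbert-base-cased-distilled-squad",  # stand-in checkpoint
)

result = qa(
    question="Which dataset were these models fine-tuned on?",
    context="The models in this listing were fine-tuned on the SQuAD dataset "
            "for extractive question answering.",
)
print(result)  # e.g. {'score': ..., 'start': ..., 'end': ..., 'answer': 'SQuAD'}
```

The pipeline returns the answer span extracted from the supplied context together with a confidence score, which is the usual way these SQuAD-style models are served.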
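The most downloaded entry, Distilbert Onnx, is, as its name suggests, intended to be run through ONNX Runtime. A minimal sketch using Hugging Face Optimum is shown below; the exact Hub ID and repository contents are assumptions, so the example falls back to exporting the public PyTorch checkpoint on the fly with `export=True`.

```python
# Minimal sketch, assuming an ONNX-oriented QA checkpoint: run the model under
# ONNX Runtime via Hugging Face Optimum. "distilbert-base-cased-distilled-squad"
# is used as a stand-in and is converted to ONNX at load time; substitute the
# actual repository ID of the Distilbert Onnx model if it ships ONNX weights.
from optimum.onnxruntime import ORTModelForQuestionAnswering
from transformers import AutoTokenizer, pipeline

model_id = "distilbert-base-cased-distilled-squad"  # stand-in checkpoint
model = ORTModelForQuestionAnswering.from_pretrained(model_id, export=True)
tokenizer = AutoTokenizer.from_pretrained(model_id)

onnx_qa = pipeline("question-answering", model=model, tokenizer=tokenizer)
print(onnx_qa(
    question="Which runtime executes the exported model?",
    context="After export, the model is executed by ONNX Runtime.",
))
```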